Deep Learning-based Sentence Embeddings using BERT for Textual Entailment

Authors

Abstract

This study directly and thoroughly investigates the practicalities of utilizing sentence embeddings, derived from deep learning foundations, for textual entailment recognition, with a specific emphasis on the robust BERT model. As the cornerstone of our research, we incorporated the Stanford Natural Language Inference (SNLI) dataset. Our work emphasizes a meticulous analysis of BERT's individual layers to ascertain the optimal layer for generating embeddings that can effectively identify entailment. Our approach deviates from traditional methodologies, as we base the evaluation on a direct, simple comparison of norms, thereby highlighting the geometrical attributes of the embeddings. Experimental results revealed that the L2 norm of embeddings drawn specifically from the 7th layer emerged as superior in entailment detection compared with other setups.
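To make the layer-wise comparison concrete, the following is a minimal sketch (not the authors' released code) of how such an evaluation could be set up with the Hugging Face transformers library: premise and hypothesis are each mean-pooled over the tokens of a chosen hidden layer, and the L2 norm of the difference between the two vectors serves as the entailment signal. The bert-base-uncased checkpoint, the mean-pooling strategy, and the distance-style decision are assumptions for illustration.

```python
# Sketch only: compare premise/hypothesis sentence embeddings taken from one
# hidden layer of BERT by the L2 norm of their difference.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

def sentence_embedding(sentence: str, layer: int = 7) -> torch.Tensor:
    """Mean-pool the token vectors of one hidden layer into a sentence vector."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    # hidden_states[0] is the embedding layer; hidden_states[7] is the 7th encoder layer
    hidden = outputs.hidden_states[layer]             # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)     # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Smaller distance -> embeddings are geometrically closer -> more likely entailment.
distance = torch.norm(sentence_embedding(premise) - sentence_embedding(hypothesis), p=2)
print(f"L2 distance at layer 7: {distance.item():.3f}")
```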


Similar Articles

Joint Learning of Sentence Embeddings for Relevance and Entailment

We consider the problem of Recognizing Textual Entailment within an Information Retrieval context, where we must simultaneously determine the relevancy as well as degree of entailment for individual pieces of evidence to determine a yes/no answer to a binary natural language question. We compare several variants of neural networks for sentence embeddings in a setting of decision-making based on...
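A hypothetical sketch of such a joint setup, under assumed details: one shared encoder produces sentence vectors for the question and each piece of evidence, and two small heads score relevance and entailment, trained together. The GRU encoder, layer sizes, and sigmoid heads are placeholders, not the cited paper's architecture.

```python
# Sketch of joint learning with a shared sentence encoder and two task heads.
import torch
import torch.nn as nn

class JointRelevanceEntailment(nn.Module):
    def __init__(self, embed_dim: int = 300, hidden_dim: int = 128):
        super().__init__()
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)  # shared encoder
        self.relevance_head = nn.Linear(2 * hidden_dim, 1)    # is this evidence relevant?
        self.entailment_head = nn.Linear(2 * hidden_dim, 1)   # does it support a "yes"?

    def forward(self, question_emb, evidence_emb):
        # question_emb, evidence_emb: (batch, seq_len, embed_dim) pre-trained word vectors
        _, q = self.encoder(question_emb)
        _, e = self.encoder(evidence_emb)
        pair = torch.cat([q.squeeze(0), e.squeeze(0)], dim=-1)
        # Joint training would sum the two binary cross-entropy losses.
        return torch.sigmoid(self.relevance_head(pair)), torch.sigmoid(self.entailment_head(pair))
```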


Recognizing Textual Entailment in Twitter Using Word Embeddings

In this paper, we investigate the application of machine learning techniques and word embeddings to the task of Recognizing Textual Entailment (RTE) in Social Media. We look at a manually labeled dataset (Lendvai et al., 2016) consisting of user generated short texts posted on Twitter (tweets) and related to four recent media events (the Charlie Hebdo shooting, the Ottawa shooting, the Sydney S...
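One plausible reading of such a word-embedding approach, sketched with assumed details (averaged pre-trained vectors per tweet, cosine similarity and element-wise differences as features, a scikit-learn classifier standing in for whichever learner the paper used):

```python
# Sketch: word-embedding features for tweet-pair entailment classification.
import numpy as np
from sklearn.linear_model import LogisticRegression

def average_embedding(tokens, vectors, dim=300):
    """Mean of the word vectors found in the lookup table (zeros if none found)."""
    found = [vectors[t] for t in tokens if t in vectors]
    return np.mean(found, axis=0) if found else np.zeros(dim)

def pair_features(text_tokens, hyp_tokens, vectors):
    """Cosine similarity plus absolute difference of the two averaged vectors."""
    t = average_embedding(text_tokens, vectors)
    h = average_embedding(hyp_tokens, vectors)
    cosine = t @ h / (np.linalg.norm(t) * np.linalg.norm(h) + 1e-8)
    return np.concatenate([[cosine], np.abs(t - h)])

# vectors: dict mapping word -> np.ndarray, e.g. loaded from pre-trained GloVe.
# X, y would be built from the labelled tweet pairs, then:
clf = LogisticRegression(max_iter=1000)  # clf.fit(X, y)
```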


Arabic Textual Entailment with Word Embeddings

Determining the textual entailment between texts is important in many NLP tasks, such as summarization, question answering, and information extraction and retrieval. Various methods have been suggested based on external knowledge sources; however, such resources are not always available in all languages and their acquisition is typically laborious and very costly. Distributional word representa...
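An illustrative way to score entailment from distributional word representations alone, with no external knowledge source: align each hypothesis word to its most similar text word by cosine similarity and average the best matches. The alignment scheme and threshold are assumptions, not the cited paper's method.

```python
# Sketch: word-alignment entailment score from pre-trained word vectors only.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def alignment_score(text_words, hyp_words, vectors):
    """Average, over hypothesis words, of the best cosine match among text words."""
    text_vecs = [vectors[w] for w in text_words if w in vectors]
    scores = [
        max((cosine(tv, vectors[h]) for tv in text_vecs), default=0.0)
        for h in hyp_words if h in vectors
    ]
    return float(np.mean(scores)) if scores else 0.0

# entails = alignment_score(text_tokens, hypothesis_tokens, vectors) > 0.7  # assumed cutoff
```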


Refining Raw Sentence Representations for Textual Entailment Recognition via Attention

In this paper we present the model used by the team Rivercorners for the 2017 RepEval shared task. First, our model separately encodes a pair of sentences into variable-length representations by using a bidirectional LSTM. Later, it creates fixed-length raw representations by means of simple aggregation functions, which are then refined using an attention mechanism. Finally it combines the refi...
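The pipeline described can be pictured with the following compressed sketch; the dimensions, pooling choices, and the final classifier are assumptions rather than the Rivercorners team's exact model.

```python
# Sketch: BiLSTM encodings -> max-pooled raw vectors -> attention-refined vectors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RefinedEncoder(nn.Module):
    def __init__(self, embed_dim=300, hidden_dim=128, num_classes=3):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(8 * hidden_dim, num_classes)

    def refine(self, states_a, states_b):
        """Attend from each token of A over the states of B, then max-pool."""
        attn = torch.softmax(states_a @ states_b.transpose(1, 2), dim=-1)  # (B, La, Lb)
        attended = attn @ states_b                                         # (B, La, 2H)
        return attended.max(dim=1).values

    def forward(self, premise_emb, hypothesis_emb):
        p, _ = self.bilstm(premise_emb)       # variable-length encodings (B, Lp, 2H)
        h, _ = self.bilstm(hypothesis_emb)
        raw_p, raw_h = p.max(dim=1).values, h.max(dim=1).values   # raw fixed-length vectors
        ref_p, ref_h = self.refine(p, h), self.refine(h, p)       # attention-refined vectors
        combined = torch.cat([raw_p, raw_h, ref_p, ref_h], dim=-1)
        return F.log_softmax(self.classifier(combined), dim=-1)
```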


Dependency-based Textual Entailment

This paper studies the role of dependency information for the task of textual entailment. Both the Text and Hypothesis of an entailment pair are mapped into sets of dependencies and a score is computed that measures the similarity of the two sets. Based on the score an entailment decision is made. Two experiments are conducted to measure the impact of dependencies on the entailment task. In one...
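A rough illustration of the dependency-overlap idea: represent the Text and the Hypothesis as sets of (head, relation, dependent) triples and score how many of the Hypothesis's dependencies the Text covers. spaCy is used here as an assumed stand-in for whichever parser the paper employed, and the threshold is illustrative.

```python
# Sketch: dependency-set overlap as an entailment score (requires the
# en_core_web_sm spaCy model to be installed).
import spacy

nlp = spacy.load("en_core_web_sm")

def dependency_set(sentence: str) -> set:
    """Lemmatized (head, relation, dependent) triples of one sentence."""
    doc = nlp(sentence)
    return {(tok.head.lemma_, tok.dep_, tok.lemma_) for tok in doc if tok.dep_ != "ROOT"}

def entailment_score(text: str, hypothesis: str) -> float:
    """Fraction of the Hypothesis's dependencies also found in the Text."""
    t, h = dependency_set(text), dependency_set(hypothesis)
    return len(t & h) / len(h) if h else 0.0

# decision = entailment_score(text, hypothesis) >= 0.5  # assumed threshold
```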



Journal

Journal title: International Journal of Advanced Computer Science and Applications

Year: 2023

ISSN: 2158-107X, 2156-5570

DOI: https://doi.org/10.14569/ijacsa.2023.01408108